
Improve async processor handling enabled items, optimize code further #5686

Merged

Conversation

yeya24
Contributor

@yeya24 yeya24 commented Nov 30, 2023

What this PR does:

Refactored multilevel cache code to integrate it better with cache items filtering.
By filtering caches in multilevel cache itself, we can avoid putting unnecessary tasks into the async processor queue.
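The filtering idea above can be sketched as follows. This is a minimal illustration, assuming hypothetical names (`filterEnabled`, the `cacheType*` constants): disabled item types are dropped per level before any backfill work is enqueued, rather than being filtered later inside the async processor.

```go
package main

import "fmt"

// Item types supported by the index cache (hypothetical names).
const (
	cacheTypePostings = "Postings"
	cacheTypeSeries   = "Series"
)

// filterEnabled returns the subset of requested item types that a given
// cache level has enabled. Filtering here means disabled types never reach
// the async backfill queue at all.
func filterEnabled(requested []string, enabled map[string]bool) []string {
	// An empty enabled set is treated as "all item types enabled".
	if len(enabled) == 0 {
		return requested
	}
	out := make([]string, 0, len(requested))
	for _, t := range requested {
		if enabled[t] {
			out = append(out, t)
		}
	}
	return out
}

func main() {
	enabled := map[string]bool{cacheTypePostings: true}
	fmt.Println(filterEnabled([]string{cacheTypePostings, cacheTypeSeries}, enabled))
	// → [Postings]
}
```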

We observed that when backfilling a large number of items (10M+), even though the operation is async, putting all items into the backfill queue one by one can already take 20s+.

This change also refactors the async queue operation: instead of enqueueing items one by one, we enqueue the whole batch of items that need backfilling, with a cap to avoid backfilling too much data.
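A minimal sketch of the batched, capped enqueue described above. The `asyncQueue` type, its methods, and the truncation policy are assumptions for illustration, not the actual Cortex implementation; the point is that one task carries the whole batch, so enqueue cost no longer scales with item count.

```go
package main

import (
	"errors"
	"fmt"
)

// errQueueFull is returned when the backfill queue cannot accept more work.
var errQueueFull = errors.New("async queue full")

// asyncQueue enqueues a whole batch of items as a single task, with a cap
// bounding how much data one batch may carry.
type asyncQueue struct {
	tasks        chan func()
	maxBatchSize int
}

func newAsyncQueue(queueLen, maxBatchSize int) *asyncQueue {
	return &asyncQueue{tasks: make(chan func(), queueLen), maxBatchSize: maxBatchSize}
}

// enqueueBatch submits one task for the whole batch. Batches over the cap
// are truncated rather than rejected, bounding total backfill cost.
func (q *asyncQueue) enqueueBatch(items [][]byte, backfill func([][]byte)) error {
	if len(items) > q.maxBatchSize {
		items = items[:q.maxBatchSize]
	}
	select {
	case q.tasks <- func() { backfill(items) }:
		return nil
	default:
		return errQueueFull // queue full: drop the batch instead of blocking
	}
}

func main() {
	q := newAsyncQueue(16, 2)
	err := q.enqueueBatch([][]byte{{1}, {2}, {3}}, func(b [][]byte) {
		fmt.Println("backfilling", len(b), "items") // → backfilling 2 items
	})
	(<-q.tasks)() // a worker goroutine would normally drain q.tasks
	fmt.Println("err:", err)
}
```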

Which issue(s) this PR fixes:
Fixes #

Checklist

  • Tests updated
  • Documentation added
  • CHANGELOG.md updated - the order of entries should be [CHANGE], [FEATURE], [ENHANCEMENT], [BUGFIX]

@yeya24 yeya24 force-pushed the improve-async-multilevel-cache-filter branch from 6b09fb2 to f0dc7e6 Compare November 30, 2023 06:58
Signed-off-by: Ben Ye <[email protected]>
@yeya24 yeya24 requested a review from alanprot November 30, 2023 17:19
@@ -76,14 +81,23 @@ func (m *multiLevelCache) FetchMultiPostings(ctx context.Context, blockID ulid.U
backFillTimer := prometheus.NewTimer(m.backFillLatency.WithLabelValues(cacheTypePostings))
defer backFillTimer.ObserveDuration()
for i, values := range backfillItems {
Member
Should we put this also on the "async task"?
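One way to address this review question is to start the latency timer inside the async task itself, so the measurement covers the actual store time rather than only the (now cheap) enqueue. A hedged sketch, with `observeLatency` and `enqueueTimed` as hypothetical stand-ins for the Prometheus histogram and the queue submit:

```go
package main

import (
	"fmt"
	"time"
)

// observeLatency stands in for a Prometheus histogram/summary Observe call.
func observeLatency(d time.Duration) { fmt.Println("backfill took", d) }

// enqueueTimed wraps the backfill work so that latency is measured inside
// the async task: the clock starts when the worker picks up the task,
// not when the caller enqueues it.
func enqueueTimed(tasks chan<- func(), work func()) {
	tasks <- func() {
		start := time.Now()
		work()
		observeLatency(time.Since(start))
	}
}

func main() {
	tasks := make(chan func(), 1)
	enqueueTimed(tasks, func() { fmt.Println("doing backfill work") })
	(<-tasks)() // a worker goroutine would normally execute this
}
```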

if len(enabledItems[0]) == 0 {
return c[0]
}
return storecache.NewFilteredIndexCache(c[0], enabledItems[0])
Member

Do we still need this type? we cannot just use a multi level with 1 level?

@yeya24 yeya24 merged commit aa5aca5 into cortexproject:master Nov 30, 2023
14 checks passed
@yeya24 yeya24 deleted the improve-async-multilevel-cache-filter branch November 30, 2023 22:26